
    Consistency of Topological Moves Based on the Balanced Minimum Evolution Principle of Phylogenetic Inference

    Many phylogenetic algorithms search the space of possible trees using topological rearrangements and some optimality criterion. FastME is such an approach that uses the balanced minimum evolution (BME) principle, which computer studies have demonstrated to have high accuracy. FastME includes two variants: balanced subtree prune and regraft (BSPR) and balanced nearest neighbor interchange (BNNI). These algorithms take as input a distance matrix and a putative phylogenetic tree. The tree is modified using SPR or NNI operations, respectively, to reduce the BME length relative to the distance matrix, until a tree with (locally) shortest BME length is found. Following computer simulations, it has been conjectured that BSPR and BNNI are consistent, i.e., that for an input distance matrix that is a tree metric they converge to the corresponding tree. We prove that the BSPR algorithm is consistent. Moreover, we show that even if the input contains small errors relative to a tree metric, the BSPR algorithm still returns the corresponding tree. Whether BNNI is consistent remains open.
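The BME length that BSPR and BNNI locally minimize can be computed from a topology and a distance matrix by Pauplin's formula: each leaf pair (i, j) contributes d_ij weighted by 2^(1 - tau_ij), where tau_ij is the number of edges on the path between them. A minimal Python sketch (the adjacency-list tree encoding and the unit-branch-length example are illustrative, not FastME's internals):

```python
from collections import deque

def leaf_topo_dists(adj, leaves):
    """BFS edge-count (topological) distance between every leaf pair."""
    tau = {}
    for src in leaves:
        seen = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen[v] = seen[u] + 1
                    q.append(v)
        for dst in leaves:
            if src < dst:
                tau[(src, dst)] = seen[dst]
    return tau

def bme_length(adj, leaves, d):
    """Pauplin's formula: L(T) = sum_{i<j} 2**(1 - tau_ij) * d_ij."""
    tau = leaf_topo_dists(adj, leaves)
    return sum(2.0 ** (1 - t) * d[pair] for pair, t in tau.items())

# Tree metric generated by the quartet tree AB|CD with unit branch lengths.
leaves = ["A", "B", "C", "D"]
d = {("A", "B"): 2, ("C", "D"): 2,
     ("A", "C"): 3, ("A", "D"): 3, ("B", "C"): 3, ("B", "D"): 3}

# Generating topology AB|CD versus the alternative AC|BD.
t_good = {"A": ["x"], "B": ["x"], "C": ["y"], "D": ["y"],
          "x": ["A", "B", "y"], "y": ["C", "D", "x"]}
t_bad = {"A": ["x"], "C": ["x"], "B": ["y"], "D": ["y"],
         "x": ["A", "C", "y"], "y": ["B", "D", "x"]}
```

On this tree metric the generating topology attains BME length 5.0, strictly below the 5.5 of the alternative, which is exactly the behavior that consistency of the rearrangement algorithms asserts in general.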

    Generalized Buneman pruning for inferring the most parsimonious multi-state phylogeny

    Accurate reconstruction of phylogenies remains a key challenge in evolutionary biology. Most biologically plausible formulations of the problem are formally NP-hard, with no known efficient solution. The standard in practice is fast heuristic methods that are empirically known to work very well in general, but can yield results arbitrarily far from optimal. Practical exact methods, which yield exponential worst-case running times but generally much better times in practice, provide an important alternative. We report progress in this direction by introducing a provably optimal method for the weighted multi-state maximum parsimony phylogeny problem. The method is based on generalizing the notion of the Buneman graph, a construction key to efficient exact methods for binary sequences, so as to apply to sequences with arbitrary finite numbers of states and arbitrary state transition weights. We implement an integer linear programming (ILP) method for the multi-state problem using this generalized Buneman graph and demonstrate that the resulting method is able to solve data sets that are intractable by prior exact methods, in run times comparable with those of popular heuristics. Our work provides the first method for provably optimal maximum parsimony phylogeny inference that is practical for multi-state data sets of more than a few characters. Comment: 15 pages.
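The paper's ILP attacks the large-parsimony problem (searching over topologies). For orientation, the small-parsimony subproblem, scoring one fixed tree under arbitrary state-transition weights, is solved exactly by Sankoff's dynamic program. A sketch with illustrative data (the tree shape, states, and weight matrix are made up for the example):

```python
INF = float("inf")

def sankoff(tree, node, states, w, leaf_state):
    """Sankoff DP for weighted multi-state parsimony on a fixed rooted tree.
    tree: internal node -> children; w[s][t]: cost of an s -> t substitution.
    Returns cost[s] = minimum total weight if `node` is assigned state s."""
    if node not in tree:                 # leaf: the observed state costs 0
        return [0 if s == leaf_state[node] else INF for s in states]
    cost = [0.0] * len(states)
    for child in tree[node]:
        child_cost = sankoff(tree, child, states, w, leaf_state)
        for s in states:
            cost[s] += min(w[s][t] + child_cost[t] for t in states)
    return cost

# One three-state character on the rooted tree ((A,B),(C,D)), unit weights.
states = [0, 1, 2]
w = [[0 if s == t else 1 for t in states] for s in states]
tree = {"root": ["u", "v"], "u": ["A", "B"], "v": ["C", "D"]}
leaf_state = {"A": 0, "B": 1, "C": 0, "D": 2}
score = min(sankoff(tree, "root", states, w, leaf_state))
```

The hard part that the generalized Buneman graph addresses is the outer search over all topologies, which this per-tree scoring routine does nothing to prune.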

    Measurement of the Charged Multiplicities in b, c and Light Quark Events from Z0 Decays

    Average charged multiplicities have been measured separately in bb̄, cc̄ and light quark (u, d, s) events from Z^0 decays recorded in the SLD experiment. Impact parameters of charged tracks were used to select enriched samples of b and light quark events, and reconstructed charmed mesons were used to select c quark events. We measured the charged multiplicities: n̄_uds = 20.21 ± 0.10 (stat.) ± 0.22 (syst.), n̄_c = 21.28 ± 0.46 (stat.) +0.41/−0.36 (syst.), and n̄_b = 23.14 ± 0.10 (stat.) +0.38/−0.37 (syst.), from which we derived the differences between the total average charged multiplicities of c or b quark events and light quark events: Δn̄_c = 1.07 ± 0.47 (stat.) +0.36/−0.30 (syst.) and Δn̄_b = 2.93 ± 0.14 (stat.) +0.30/−0.29 (syst.). We compared these measurements with those at lower center-of-mass energies and with perturbative QCD predictions. The combined results are in agreement with the QCD expectations and disfavor the hypothesis of flavor-independent fragmentation. Comment: 19 pages LaTeX, 4 EPS figures; to appear in Physics Letters.
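The quoted differences follow from subtracting the light-quark value, with the statistical uncertainties (independent between samples) combined in quadrature; the systematic uncertainties are partly correlated between flavor samples and do not combine this naively, which is why the quoted systematic errors on the differences are smaller than simple quadrature would give. A sketch of the statistical part only:

```python
from math import hypot

# Central values with statistical uncertainties only (from the abstract).
n_uds = (20.21, 0.10)
n_c = (21.28, 0.46)
n_b = (23.14, 0.10)

def subtract(a, b):
    """Difference of two measurements, independent errors in quadrature."""
    return a[0] - b[0], hypot(a[1], b[1])

dn_c = subtract(n_c, n_uds)   # ~ (1.07, 0.47), matching the quoted stat. error
dn_b = subtract(n_b, n_uds)   # ~ (2.93, 0.14)
```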

    Risk profiles and one-year outcomes of patients with newly diagnosed atrial fibrillation in India: Insights from the GARFIELD-AF Registry.

    BACKGROUND: The Global Anticoagulant Registry in the FIELD-Atrial Fibrillation (GARFIELD-AF) is an ongoing prospective noninterventional registry, which is providing important information on the baseline characteristics, treatment patterns, and 1-year outcomes in patients with newly diagnosed non-valvular atrial fibrillation (NVAF). This report describes data from Indian patients recruited in this registry. METHODS AND RESULTS: A total of 52,014 patients with newly diagnosed AF were enrolled globally; of these, 1388 patients were recruited from 26 sites within India (2012-2016). In India, the mean age at diagnosis of NVAF was 65.8 years. Hypertension was the most prevalent risk factor for AF, present in 68.5% of patients from India and in 76.3% of patients globally (P < 0.001). Diabetes and coronary artery disease (CAD) were prevalent in 36.2% and 28.1% of patients, as compared with global prevalences of 22.2% and 21.6%, respectively (P < 0.001 for both). Antiplatelet therapy was the most common antithrombotic treatment in India. With increasing stroke risk, however, patients were more likely to receive oral anticoagulant therapy [mainly vitamin K antagonist (VKA)], but the average international normalized ratio (INR) was lower among Indian patients [median INR value 1.6 (interquartile range {IQR}: 1.3-2.3) versus 2.3 (IQR 1.8-2.8) (P < 0.001)]. Compared with other countries, patients from India had markedly higher rates of all-cause mortality [7.68 per 100 person-years (95% confidence interval 6.32-9.35) vs 4.34 (4.16-4.53), P < 0.0001], while rates of stroke/systemic embolism and major bleeding were lower after 1 year of follow-up. CONCLUSION: Compared to previously published registries from India, the GARFIELD-AF registry describes clinical profiles and outcomes in Indian patients with AF of a different etiology. The registry data show that, compared to the rest of the world, Indian AF patients are younger and have more diabetes and CAD. Patients with a higher stroke risk are more likely to receive anticoagulation therapy with VKA but are underdosed compared with the global average in GARFIELD-AF. CLINICAL TRIAL REGISTRATION-URL: http://www.clinicaltrials.gov. Unique identifier: NCT01090362.

    Immune plexins and semaphorins: old proteins, new immune functions

    Plexins and semaphorins are a large family of proteins involved in cell movement and response. The importance of plexins and semaphorins has been emphasized by their discovery in many organ systems, including the nervous (Nkyimbeng-Takwi and Chapoval, 2011; McCormick and Leipzig, 2012; Yaron and Sprinzak, 2012), epithelial (Miao et al., 1999; Fujii et al., 2002), and immune systems (Takamatsu and Kumanogoh, 2012), as well as in diverse cell processes including angiogenesis (Serini et al., 2009; Sakurai et al., 2012), embryogenesis (Perala et al., 2012), and cancer (Potiron et al., 2009; Micucci et al., 2010). Plexins and semaphorins are transmembrane proteins that share a conserved extracellular semaphorin domain (Hota and Buck, 2012). The plexins and semaphorins are divided into four and eight subfamilies, respectively, based on their structural homology. Semaphorins are relatively small proteins containing the extracellular semaphorin domain and short intracellular tails. Plexins contain the semaphorin domain and long intracellular tails (Hota and Buck, 2012). The majority of plexin and semaphorin research has focused on the nervous system, particularly the developing nervous system, where these proteins mediate many common neuronal cell processes including cell movement, cytoskeletal rearrangement, and signal transduction (Choi et al., 2008; Takamatsu et al., 2010). Their roles in the immune system are the focus of this review.

    Proceedings of the 2016 Childhood Arthritis and Rheumatology Research Alliance (CARRA) Scientific Meeting


    The MinMax Squeeze: Guaranteeing a minimal tree for population data

    We report that for population data, where sequences are very similar to one another, it is often possible to use a two-pronged (MinMax Squeeze) approach to prove that a tree is the shortest possible under the parsimony criterion. Such population data can be in a range where parsimony is a maximum likelihood estimator. This is in sharp contrast to the case with species data, where sequences are much further apart and the problem of guaranteeing an optimal phylogenetic tree is known to be computationally prohibitive for realistic numbers of species, irrespective of whether likelihood or parsimony is the optimality criterion. The Squeeze uses both an upper bound (the length of the shortest tree known) and a lower bound derived from partitions of the columns (a length no tree can beat). If the two bounds meet, the shortest known tree is thereby proven to be a shortest possible tree. The implementation is first tested on simulated data sets and then applied to 53 complete human mitochondrial genomes. The shortest possible trees for those data offer several significant improvements over the published tree. Namely, a pair of Australian lineages comes deeper in the tree (in agreement with archaeological data), and the non-African part of the tree shows greater agreement with the geographical distribution of lineages.
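The logic of the squeeze can be illustrated with the crudest column partition, singleton blocks: each column alone forces at least (number of distinct states − 1) changes on any tree, and a Fitch pass over a candidate tree gives the upper bound. When the two numbers meet, that tree is provably shortest. A hedged Python sketch with toy data (the actual method partitions columns into richer blocks than singletons):

```python
def fitch(tree, node, seqs, col):
    """Fitch parsimony for one column on a fixed rooted binary tree.
    Returns (candidate state set, changes required below `node`)."""
    if node not in tree:                 # leaf
        return {seqs[node][col]}, 0
    (s1, c1), (s2, c2) = (fitch(tree, ch, seqs, col) for ch in tree[node])
    if s1 & s2:
        return s1 & s2, c1 + c2
    return s1 | s2, c1 + c2 + 1

def singleton_lower_bound(seqs, ncols):
    """Any tree needs at least (#distinct states - 1) changes per column;
    summing over a partition of the columns gives a valid lower bound."""
    return sum(len({s[c] for s in seqs.values()}) - 1 for c in range(ncols))

# Toy alignment of two binary columns on the candidate tree ((A,B),(C,D)).
seqs = {"A": "00", "B": "01", "C": "11", "D": "11"}
tree = {"root": ["u", "v"], "u": ["A", "B"], "v": ["C", "D"]}
upper = sum(fitch(tree, "root", seqs, c)[1] for c in range(2))
lower = singleton_lower_bound(seqs, 2)
```

Here the bounds meet at 2 changes, so the candidate tree is provably a shortest tree for these data; on real data the interesting work is choosing column partitions whose blocks tighten the lower bound.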

    Six points suffice: How to check for metric consistency

    In many areas of data analysis, it is desirable to have tools at hand for analyzing the structure of distance tables or, in more mathematical terms, of finite metric spaces. One such tool, known as split decomposition theory, has proven particularly useful in this respect. The class of so-called totally decomposable metrics forms a cornerstone of this theory, and much work has been devoted to their study. Recently, it has become apparent that a particular subclass of these metrics, the consistent metrics, is also of fundamental importance. In this paper, we give a six-point characterization of consistent metrics amongst the totally decomposable ones.
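The paper's six-point test belongs to the same tradition as the classical four-point condition, which characterizes tree metrics by inspecting every quadruple: of the three pairwise sums d_ij + d_kl, d_ik + d_jl, d_il + d_jk, the two largest must coincide. A small checker in that spirit (this is the four-point condition, not the paper's six-point consistency test):

```python
from itertools import combinations
from math import sqrt

def is_tree_metric(d, points, tol=1e-9):
    """Four-point condition: for every quadruple {i,j,k,l}, the two
    largest of the three pairwise sums must be equal (up to tol)."""
    for i, j, k, l in combinations(points, 4):
        s = sorted([d[i][j] + d[k][l], d[i][k] + d[j][l], d[i][l] + d[j][k]])
        if s[2] - s[1] > tol:
            return False
    return True

def symmetric(entries, n):
    """Build a symmetric distance table from upper-triangle entries."""
    d = {i: {i: 0.0} for i in range(n)}
    for (i, j), v in entries.items():
        d[i][j] = d[j][i] = v
    return d

# A metric realized by a quartet tree with unit branch lengths: passes.
tree_like = symmetric({(0, 1): 2, (2, 3): 2, (0, 2): 3,
                       (0, 3): 3, (1, 2): 3, (1, 3): 3}, 4)
# Corners of a unit square under Euclidean distance: fails.
square = symmetric({(0, 1): 1, (1, 2): 1, (2, 3): 1, (0, 3): 1,
                    (0, 2): sqrt(2), (1, 3): sqrt(2)}, 4)
```

Both tests are "local-to-global": a condition checked on all small subconfigurations (four points here, six in the paper) certifies a property of the whole metric space.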